10 research outputs found

    Legal situation and current practice of waste incineration bottom ash utilisation in Europe

    Almost 500 municipal solid waste incineration plants in the EU, Norway, and Switzerland generate about 17.6 Mt/a of incinerator bottom ash (IBA). IBA contains minerals and metals. Metals are mostly separated and sold to the scrap market, while the minerals are either disposed of in landfills or utilised in the construction sector. Since there is no uniform regulation for IBA utilisation at EU level, countries have developed their own rules with varying requirements for utilisation. As a result of a cooperation network of European experts, an up-to-date overview of the documents regulating IBA utilisation is presented. Furthermore, this work highlights the different requirements that have to be considered. Overall, 51 different parameters for the total content and 36 different parameters for the emission by leaching are defined. An analysis of the defined parameters reveals that leaching parameters have to be considered significantly more often than total content parameters. To assess the leaching behaviour, nine different leaching tests are in place, including batch tests, up-flow percolation tests, and one diffusion test (for monolithic materials). A further discussion of leaching parameters showed that certain countries adopted limit values initially defined for landfills for inert waste and applied them to IBA utilisation. The overall utilisation rate of IBA in construction works is approximately 54 wt.%. It is revealed that the rate of utilisation does not necessarily depend on how well regulated IBA utilisation is, but rather seems to be a result of political commitment to IBA recycling and economically favourable circumstances.
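    A back-of-the-envelope check of scale, assuming the reported ~54 wt.% utilisation rate applies to the full ~17.6 Mt/a of generated IBA (an illustrative simplification, not a figure from the study):

        # Illustrative arithmetic only; the two input numbers come from the abstract
        # above, combining them is our own simplification.
        iba_generated_mt_per_year = 17.6   # Mt/a of IBA in the EU, Norway and Switzerland
        utilisation_rate = 0.54            # overall utilisation rate in construction works

        utilised_mt_per_year = iba_generated_mt_per_year * utilisation_rate
        print(f"approx. {utilised_mt_per_year:.1f} Mt/a of IBA utilised in construction")
        # -> approx. 9.5 Mt/a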

    A Roadmap for HEP Software and Computing R&D for the 2020s

    Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer amounts of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade.

    Graph neural networks at the Large Hadron Collider

    From raw detector activations to reconstructed particles, data at the Large Hadron Collider (LHC) are sparse, irregular, heterogeneous and highly relational in nature. Graph neural networks (GNNs), a class of algorithms belonging to the rapidly growing field of geometric deep learning (GDL), are well suited to tackling such data because GNNs are equipped with relational inductive biases that explicitly make use of localized information encoded in graphs. Furthermore, graphs offer a flexible and efficient alternative to rectilinear structures when representing sparse or irregular data, and can naturally encode heterogeneous information. For these reasons, GNNs have been applied to a number of LHC physics tasks, including reconstructing particles from detector readouts and discriminating physics signals against background processes. We introduce and categorize these applications in a manner accessible to both physicists and non-physicists. Our explicit goal is to bridge the gap between the particle physics and GDL communities. After an accessible description of LHC physics, including theory, measurement, simulation and analysis, we overview applications of GNNs at the LHC. We conclude by highlighting technical challenges and future directions that may inspire further collaboration between the physics and GDL communities.
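    To make the idea concrete, here is a minimal, self-contained sketch (plain PyTorch, with toy hit coordinates and a deliberately crude graph-building rule, all of it hypothetical rather than taken from any LHC pipeline) of representing hits as a graph and applying one round of message passing:

        import torch

        # Toy "hits": each node carries (layer, phi, z)-like coordinates as features.
        hits = torch.tensor([[1.0, 0.10,  0.5],
                             [2.0, 0.12,  1.0],
                             [3.0, 0.15,  1.5],
                             [1.0, 2.00, -0.5]])

        # Connect hits on adjacent layers whose remaining coordinates are close --
        # a crude stand-in for geometric graph construction.
        edges = [(i, j)
                 for i in range(len(hits))
                 for j in range(len(hits))
                 if hits[j, 0] - hits[i, 0] == 1.0
                 and torch.norm(hits[j, 1:] - hits[i, 1:]) < 0.6]
        edge_index = torch.tensor(edges).t()                  # shape (2, n_edges)

        # One message-passing step: every node sums the concatenated features of its
        # incoming edges, then applies a small learned update.
        update = torch.nn.Linear(2 * hits.size(1), 8)
        src, dst = edge_index
        messages = torch.cat([hits[src], hits[dst]], dim=1)   # (n_edges, 6)
        aggregated = torch.zeros(len(hits), messages.size(1))
        aggregated.index_add_(0, dst, messages)               # sum per receiving node
        node_embeddings = torch.relu(update(aggregated))      # (n_hits, 8)
        print(node_embeddings.shape)

    Real GNN applications at the LHC typically replace each of these pieces with learned, physics-informed versions: geometric or learned graph construction, several message-passing iterations, and task-specific output heads for reconstruction or signal/background discrimination.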

    Towards a realistic track reconstruction algorithm based on graph neural networks for the HL-LHC

    The physics reach of the HL-LHC will be limited by how efficiently the experiments can use the available computing resources, i.e. affordable software and computing are essential. The development of novel methods for charged particle reconstruction at the HL-LHC incorporating machine learning techniques, or based entirely on machine learning, is a vibrant area of research. In the past two years, algorithms for track pattern recognition based on graph neural networks (GNNs) have emerged as a particularly promising approach. Previous work mainly aimed at establishing a proof of principle. In the present document we describe new algorithms that can handle complex, realistic detectors. The new algorithms are implemented in ACTS, a common framework for tracking software. This work aims at implementing a realistic GNN-based algorithm that can be deployed in an HL-LHC experiment.
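    As an illustration of the edge-classification idea at the heart of such algorithms (a hypothetical sketch, not the ACTS implementation), each candidate edge connecting two hits is scored by a small network, and only high-scoring edges are kept as input to track building:

        import torch

        n_hits, feat_dim = 6, 3
        hit_features = torch.randn(n_hits, feat_dim)        # stand-in for (r, phi, z) per hit
        edge_index = torch.tensor([[0, 1, 2, 3, 4],         # candidate edges: source hits
                                   [1, 2, 3, 4, 5]])        #                  target hits

        # Edge classifier: an MLP over the concatenated features of the two hits.
        edge_mlp = torch.nn.Sequential(
            torch.nn.Linear(2 * feat_dim, 16),
            torch.nn.ReLU(),
            torch.nn.Linear(16, 1),
        )

        src, dst = edge_index
        edge_scores = torch.sigmoid(
            edge_mlp(torch.cat([hit_features[src], hit_features[dst]], dim=1))
        ).squeeze(1)

        # Keep high-scoring edges; connected hits then seed the track candidates.
        selected_edges = edge_index[:, edge_scores > 0.5]
        print(edge_scores, selected_edges)

    In a trained pipeline the scores would typically come from a GNN that has exchanged information across the graph several times, and the surviving edges are then grouped, for example via connected components, into track candidates.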

    CHEP 2021: Preface to the Proceedings

    The 25th International Conference on Computing in High Energy and Nuclear Physics (CHEP), organised by CERN, took place as a virtual event from 17 to 21 May 2021. The conference attracted 1144 registered participants from 46 different countries. There were 207 scientific presentations made over the 5 days of the conference. These were divided between 30 long talks and 2 keynotes, which were presented in plenary sessions, and 175 short talks, which were presented in parallel sessions.

    System Performance and Cost Modelling in LHC computing

    The increase in the scale of LHC computing expected for Run 3, and even more so for Run 4 (HL-LHC), over the next ten years will certainly require radical changes to the computing models and the data processing of the LHC experiments. Translating the requirements of the physics programmes into computing resource needs is a complicated process and subject to significant uncertainties. For this reason, WLCG has established a working group to develop methodologies and tools intended to characterise the LHC workloads, better understand their interaction with the computing infrastructure, calculate their cost in terms of resources and expenditure, and assist experiments, sites and the WLCG project in the evaluation of their future choices. This working group started in November 2017 and has about 30 active participants representing experiments and sites. In this contribution we present the activities, the results achieved and the future directions.
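    To indicate the kind of translation involved (a deliberately simplified sketch with placeholder numbers, not the working group's actual model), one can go from a workload's per-event requirements to aggregate resource needs and an approximate cost:

        # All numbers below are placeholders for illustration only.
        events_to_process  = 5.0e10                      # events in a hypothetical campaign
        cpu_sec_per_event  = 20.0                        # CPU seconds needed per event
        event_size_mb      = 2.0                         # persistent output per event (MB)

        core_year_seconds  = 0.75 * 365 * 24 * 3600      # assume ~75% average core utilisation
        cost_per_core_year = 10.0                        # currency units per core-year
        cost_per_tb_year   = 20.0                        # currency units per TB of disk per year

        cpu_core_years = events_to_process * cpu_sec_per_event / core_year_seconds
        storage_tb     = events_to_process * event_size_mb / 1e6

        # Storage cost is counted for one year of retention in this toy model.
        total_cost = cpu_core_years * cost_per_core_year + storage_tb * cost_per_tb_year
        print(f"CPU: {cpu_core_years:,.0f} core-years, disk: {storage_tb:,.0f} TB, "
              f"cost: {total_cost:,.0f} units")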

    New developments in cost modeling for the LHC computing

    The increase in the scale of LHC computing during Run 3 and Run 4 (HL-LHC) will certainly require radical changes to the computing models and the data processing of the LHC experiments. The working group established by WLCG and the HEP Software Foundation to investigate all aspects of the cost of computing and how to optimise them has continued producing results and improving our understanding of this process. In particular, experiments have developed more sophisticated ways to calculate their resource needs, and we have a much more detailed process to calculate infrastructure costs. This includes studies on the impact of HPC and GPU based resources on meeting the computing demands. We have also developed and perfected tools to quantitatively study the performance of experiment workloads, and we are actively collaborating with other activities related to data access, benchmarking and technology cost evolution. In this contribution we present our recent developments and results and outline the directions of future work.
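    As a flavour of the quantitative comparisons such tools enable (hypothetical throughput and price figures, not measurements), one can compare the implied cost per event of a CPU-only workload against a GPU-accelerated variant:

        # Placeholder figures: (events per second per node, cost per node-hour).
        workloads = {
            "reco_cpu_only":    (50.0, 1.0),
            "reco_gpu_offload": (400.0, 4.0),
        }

        for name, (events_per_second, cost_per_node_hour) in workloads.items():
            events_per_hour = events_per_second * 3600
            cost_per_million_events = cost_per_node_hour / events_per_hour * 1.0e6
            print(f"{name}: {cost_per_million_events:.2f} units per million events")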

    A Roadmap for HEP Software and Computing R&D for the 2020s

    Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer amounts of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade.